
    Stochastic circuit breaker network model for bipolar resistance switching memories

    We present a stochastic model for resistance switching devices in which a square grid of resistor breakers plays the role of the insulating switching layer. The probability of a breaker switching between two fixed resistance values, ROFF and RON, is determined by the corresponding voltage drop and thermal Joule heating, and the collective switching of the breakers produces the overall device resistance change. The model reproduces the salient features of all the switching operations of bipolar resistance switching memories (RRAMs) and is compared to a prototypical HfO2-based RRAM device. In particular, the model captures the need for a forming process that brings a fresh, highly insulating device to a low resistance state (LRS). Moreover, it reproduces both the RESET process, which partially restores the insulating state through a gradual resistance transition as a function of the applied voltage, and the abrupt nature of the SET process that restores the LRS. Furthermore, the multilevel capability of a typical RRAM device, obtained by tuning the RESET voltage and the SET compliance current, is reproduced. The manuscript analyses the key ingredients of the model and their influence on the simulated current-voltage curves and, in addition, provides a detailed description of the mechanisms that connect the switching of the single breakers to that of the overall device.
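    As a rough illustration of the model's machinery, the sketch below simulates a small breaker grid: nodal analysis gives the voltage drop across each breaker, and a sigmoidal, drop-dependent probability decides whether it switches. The grid size, resistance values, threshold and sigmoid steepness are illustrative assumptions (not the paper's calibrated parameters), and the thermal Joule-heating feedback described in the abstract is omitted for brevity.

    import numpy as np

    N = 8                       # nodes per grid side (illustrative)
    R_ON, R_OFF = 1e3, 1e6      # breaker resistances in ohms (assumed values)
    V_TH, BETA = 0.4, 10.0      # hypothetical switching threshold and steepness
    rng = np.random.default_rng(0)

    def node(i, j):
        return i * N + j

    # Breakers connect vertically and horizontally adjacent nodes; the top
    # row of nodes is the biased electrode, the bottom row is grounded.
    edges = []
    for i in range(N):
        for j in range(N):
            if i + 1 < N:
                edges.append((node(i, j), node(i + 1, j)))
            if j + 1 < N:
                edges.append((node(i, j), node(i, j + 1)))
    state = np.zeros(len(edges), dtype=bool)   # fresh device: all breakers OFF

    def solve_voltages(v_app):
        """Nodal analysis of the breaker grid at applied bias v_app."""
        G = np.zeros((N * N, N * N))
        for e, (a, b) in enumerate(edges):
            g = 1.0 / (R_ON if state[e] else R_OFF)
            G[a, a] += g
            G[b, b] += g
            G[a, b] -= g
            G[b, a] -= g
        v = np.zeros(N * N)
        fixed = np.zeros(N * N, dtype=bool)
        fixed[:N] = True                       # top electrode nodes
        v[:N] = v_app
        fixed[N * (N - 1):] = True             # bottom electrode nodes (0 V)
        free = ~fixed
        v[free] = np.linalg.solve(G[np.ix_(free, free)],
                                  -G[np.ix_(free, fixed)] @ v[fixed])
        return v

    def step(v_app):
        """One bias step: each breaker may switch, driven by its voltage drop."""
        v = solve_voltages(v_app)
        for e, (a, b) in enumerate(edges):
            dv = v[a] - v[b]                   # signed drop across the breaker
            if not state[e] and dv > 0:        # SET under positive drop
                p = 1.0 / (1.0 + np.exp(-BETA * (dv - V_TH)))
            elif state[e] and dv < 0:          # RESET under negative drop (bipolar)
                p = 1.0 / (1.0 + np.exp(-BETA * (-dv - V_TH)))
            else:
                continue
            if rng.random() < p:
                state[e] = not state[e]

    # Sweeping the bias upward mimics forming/SET; a negative sweep gradually
    # RESETs the conductive paths formed across the grid.
    for v_app in np.linspace(0.0, 3.0, 30):
        step(v_app)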

    Flood Risk Estimation through Document Sources Analysis: the Case of the Amalfi Rocky Coast

    In the last century the Amalfi Coast was affected by numerous severe floods, triggered by exceptional rainfall, that caused major damage in terms of lives lost and economic cost. Historical documentary sources are an important source of information for reconstructing exceptional flood events that occurred prior to the instrumental era, and historical analysis also provides an opportunity to extend the time window for flood risk studies. To study historical floods we collected all the available information concerning the period between the 16th and the 20th centuries by analysing both published and unpublished sources. The great variety of historical sources made it necessary to formulate an ad hoc scientific procedure that takes into account not only the completeness and reliability of documents related to the period, but also the intrinsic quality of the material. Experience in historical data collection shows that not all documentary sources can provide useful information for flood characterization; selective criteria are therefore necessary to extract the best information rather than simply the largest dataset. Analysis of the data allowed us to achieve a chronological reconstruction of more than 100 floods. In this task, the level of information was decisive in carrying out space–time identification, estimating the affected area, and defining the type of damage to public and private structures and the geological effects induced.

    Analog Memristive Synapse in Spiking Networks Implementing Unsupervised Learning

    Emerging brain-inspired architectures call for devices that can emulate the functionality of biological synapses in order to implement new, efficient computational schemes able to solve ill-posed problems. Various devices and solutions are still under investigation and, in this respect, the challenge is open to researchers in the field. Indeed, the optimal candidate is a device able to reproduce the complete functionality of a synapse, i.e. the typical synaptic process underlying learning in biological systems (activity-dependent synaptic plasticity). This implies a device able to change its resistance (synaptic strength, or weight) upon proper electrical stimuli (synaptic activity) and showing several stable resistive states throughout its dynamic range (analog behavior). Moreover, it should be able to perform spike timing dependent plasticity (STDP), an associative homosynaptic plasticity learning rule based on the delay between the firing of the two neurons the synapse connects. This rule is a fundamental learning protocol in state-of-the-art networks because it allows unsupervised learning. Nevertheless, STDP-based unsupervised learning has so far been proposed mainly for binary synapses, or for multilevel synapses composed of many binary memristors. This paper proposes an HfO2-based analog memristor as a synaptic element that performs STDP within a small spiking neuromorphic network carrying out unsupervised learning for character recognition. The trained network is able to recognize five characters even when incomplete or noisy characters are presented, and it is robust to a device-to-device variability of up to ±30%.
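    As a concrete reference for the learning rule, here is a minimal pair-based STDP weight update for an analog, bounded synapse; the amplitudes, time constant and weight range are illustrative assumptions, not the HfO2 device characteristics measured in the paper.

    import numpy as np

    A_PLUS, A_MINUS = 0.05, 0.05   # potentiation/depression amplitudes (assumed)
    TAU = 20e-3                    # plasticity time constant in seconds (assumed)
    W_MIN, W_MAX = 0.0, 1.0        # normalized conductance range

    def stdp_update(w, t_pre, t_post):
        """Return the new weight after one pre/post spike pairing.

        Pre-before-post (causal) pairings potentiate the synapse;
        post-before-pre (anti-causal) pairings depress it, with an
        exponential dependence on the pre/post delay.
        """
        dt = t_post - t_pre
        if dt > 0:
            w += A_PLUS * np.exp(-dt / TAU)      # causal: potentiate
        else:
            w -= A_MINUS * np.exp(dt / TAU)      # anti-causal: depress
        return float(np.clip(w, W_MIN, W_MAX))   # analog but bounded weight

    # Example: a causal pairing 5 ms apart strengthens the synapse.
    print(stdp_update(0.5, t_pre=0.000, t_post=0.005))   # ~0.539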

    Physical Implementation of a Tunable Memristor-based Chua's Circuit

    Nonlinearity is a central feature in demanding computing applications that aim to deal with tasks such as optimization or classification. Furthermore, the consensus is that nonlinearity should not be exploited only at the algorithm level, but also at the physical level, by finding devices that incorporate the desired nonlinear features to physically implement energy-, area- and/or time-efficient computing applications. Chaotic oscillators are one type of system powered by nonlinearity which can be used for computing purposes. In this work we present a physical implementation of a tunable Chua's circuit in which the nonlinear part is based on a nonvolatile memristive device. Device characterization and circuit analysis serve as guidelines to design the circuit, and the results demonstrate the possibility of tuning the circuit's oscillatory response by electrically programming the device.
    Comment: Accepted by the IEEE 48th European Solid State Circuits Conference (ESSCIRC 2022).
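    For readers unfamiliar with the circuit, the sketch below integrates the dimensionless Chua equations. The classic piecewise-linear Chua diode stands in for the memristive device (whose programmed state would shift the slopes M0, M1), and the parameter values are the textbook double-scroll set, not those measured in the paper.

    import numpy as np
    from scipy.integrate import solve_ivp

    ALPHA, BETA = 15.6, 28.0   # textbook double-scroll parameters (assumed)
    M0, M1 = -8 / 7, -5 / 7    # inner/outer slopes of the nonlinearity

    def chua_diode(x):
        # Piecewise-linear i-v characteristic; in the paper this role is
        # played by a memristor, so reprogramming it shifts these slopes.
        return M1 * x + 0.5 * (M0 - M1) * (abs(x + 1) - abs(x - 1))

    def rhs(t, s):
        x, y, z = s
        return [ALPHA * (y - x - chua_diode(x)), x - y + z, -BETA * y]

    sol = solve_ivp(rhs, (0.0, 100.0), [0.1, 0.0, 0.0], max_step=0.01)
    # Plotting sol.y[0] against sol.y[2] traces the double-scroll attractor;
    # sweeping M0/M1 (i.e., the memristor state) tunes the oscillation.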

    2022 roadmap on neuromorphic computing and engineering

    Modern computation based on the von Neumann architecture is now a mature, cutting-edge science. In the von Neumann architecture, processing and memory units are implemented as separate blocks interchanging data intensively and continuously, and this data transfer is responsible for a large part of the power consumption. The next generation of computer technology is expected to solve problems at the exascale, with 10^18 calculations each second. Even though these future computers will be incredibly powerful, if they are based on von Neumann-type architectures they will consume between 20 and 30 megawatts of power and will not have intrinsic, physically built-in capabilities to learn or deal with complex data as our brain does. These needs can be addressed by neuromorphic computing systems, which are inspired by the biological concepts of the human brain. This new generation of computers has the potential to store and process large amounts of digital information with much lower power consumption than conventional processors. Among their potential future applications, an important niche is moving control from data centers to edge devices. The aim of this roadmap is to present a snapshot of the present state of neuromorphic technology and provide an opinion on the challenges and opportunities that the future holds in the major areas of neuromorphic technology, namely materials, devices, neuromorphic circuits, neuromorphic algorithms, applications, and ethics. The roadmap is a collection of perspectives in which leading researchers in the neuromorphic community provide their own view of the current state and the future challenges of each research area. We hope that this roadmap will be a useful resource, providing a concise yet comprehensive introduction for readers outside this field and for those who are just entering it, as well as future perspectives for those who are well established in the neuromorphic computing community.
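    A quick back-of-envelope check makes the quoted figures concrete; the brain-power comparison in the final comment (~20 W) is a commonly cited estimate, not a number from the roadmap.

    # Energy budget implied by the quoted exascale figures.
    power_w = 20e6              # 20 MW, low end of the quoted range
    ops_per_s = 1e18            # one exa-operation per second
    print(power_w / ops_per_s * 1e12, "pJ per operation")   # 20.0 pJ/op
    # For scale, the human brain is commonly estimated to run on ~20 W,
    # i.e., about six orders of magnitude less power.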

    Prima che si perda la memoria: viaggio iconografico in Irpinia tra dissesti e terremoti / Before memory is gone: an iconographic journey among the landslides and earthquakes of Irpinia, Italy

    The memory of a place, once passed down by our elders, is often lost today, in spite of the new forms of multimedia communication; among the causes are frenetic schedules and changed approaches to time, space and the past. This is why we suggest a visit to the parts of Irpinia (Campania, Italy) where present and past memory are entrusted to a project in which photography and iconography are the main protagonists. The concept of the voyage is to traverse the historical landscape affected by the natural disasters that have fundamentally altered the regional makeup. The journey begins at Melito Irpino (province of Avellino), considered a village at risk since the early twentieth century and subject to calamities such as landslides, the earthquakes of 1930 and 1962, and the 1949 flood. Ultimately, the events of 1962 led to the abandonment of the village and the reconstruction of the community at another site. Using images, we attempt to reconstruct the old and new identity of the “Melito Irpino Paese”.

    The 1976 Guatemala Earthquake: ESI Scale and Probabilistic/Deterministic Seismic Hazard Analysis Approaches

    A hazard assessment of the 1976 Guatemala earthquake (M = 7.5) was conducted to achieve a better definition of the seismic hazard, based on the environmental effects that contributed substantially to the highly destructive impact of that event. An interdisciplinary approach was adopted, integrating: (1) historical data; (2) co-seismic geological effects expressed as Environmental Seismic Intensity (ESI) scale values; and (3) ground shaking data estimated by a probabilistic/deterministic approach. A detailed analysis of primary and secondary effects was conducted for a set of 24 localities to obtain a better evaluation of seismic intensity. The new intensity values were compared with the Modified Mercalli Intensity (MMI) and the Peak Ground Acceleration (PGA) distribution estimated using a probabilistic/deterministic hazard analysis for the target area. Our results provide evidence that probabilistic/deterministic hazard analysis procedures may yield very different indications of the PGA distribution. Moreover, PGA values often display a significant discrepancy from the macroseismic intensity values calculated with the ESI scale. Therefore, incorporating environmental earthquake effects into probabilistic/deterministic hazard analysis appears to be mandatory in order to achieve a more accurate estimation of the seismic hazard.
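    The ESI-versus-PGA comparison step can be sketched as follows. A ground-motion-to-intensity conversion equation (GMICE) of the generic form I = a*log10(PGA) + b is assumed, with Wald et al. (1999)-style coefficients for PGA in cm/s^2; both the coefficients and the locality records below are illustrative, not the data used in the paper.

    import math

    A, B = 3.66, -1.66   # GMICE coefficients (illustrative, PGA in cm/s^2)

    def intensity_from_pga(pga_cm_s2):
        """Macroseismic intensity implied by a PGA value (assumed GMICE)."""
        return A * math.log10(pga_cm_s2) + B

    # Hypothetical locality records: (name, ESI field intensity, modeled PGA).
    localities = [("Guatemala City", 9.0, 350.0), ("Mixco", 8.0, 250.0)]
    for name, esi, pga in localities:
        i_pga = intensity_from_pga(pga)
        print(f"{name}: ESI={esi:.0f}, PGA-derived I={i_pga:.1f}, "
              f"discrepancy={esi - i_pga:+.1f}")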